Hello,
I would like to detect when the app is task-killed (i.e. swiped away from the app switcher while in the background). I thought applicationWillTerminate would be called at that time, but it is not.
How can I detect that the app has been removed from the background?
Thanks for your help in advance.
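For reference, this is a minimal sketch of what I tried (assuming a plain UIKit AppDelegate); in my testing the method fires when the system terminates a running app, but not when a suspended app is swiped away:

```swift
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        return true
    }

    // Not called when the app is already suspended and the user
    // removes it from the app switcher.
    func applicationWillTerminate(_ application: UIApplication) {
        print("applicationWillTerminate")
    }
}
```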
I use the following push notification payload.
{
  "aps": {
    "sound": {
      "name": "default",
      "volume": 0
    }
  },
  "type": "refresh"
}
When the iPhone receives this, it plays a sound even though "volume" is set to 0.
Is this an iOS bug? Or does the default alert sound of iOS 17 override this?
I tested on iOS 17 and iOS 16; it occurred on both.
Is there any way to play no sound when the iPhone receives a specific push notification such as "type": "refresh"?
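The workaround I am currently trying (an assumption on my part, not confirmed behavior on every OS version): as far as I know, no sound is played when the "sound" key is omitted from "aps" entirely, so a payload like this stays silent:

```json
{
  "aps": {
    "alert": {
      "title": "Refresh"
    }
  },
  "type": "refresh"
}
```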
Hi there,
My app plays the default alert sound when it receives a remote push notification, and also at other times.
I understand that the default alert sound plays when a remote push notification is received. However, it also plays at moments when my app has not received one.
I have no idea what makes the default alert sound play.
Is there any trigger for the default alert sound other than receiving a remote push notification?
Hi,
I often use the Find bar in Xcode, and I type Japanese Hiragana and Kanji characters.
After updating to Xcode 15, Japanese Hiragana can no longer be converted to Kanji correctly.
Apple, please fix this bug.
Hi,
I have a camera app in which the user can select the ISO value.
On the iPhone 13 Pro, the minimum ISO value is 40, so I set the initial ISO value to 40.
I recently replaced my iPhone 13 Pro with the latest model, the iPhone 15 Pro, and found that its minimum ISO value is 55.
My camera app supports the iPhone 15 as well as the iPhone 15 Pro. Unfortunately, I do not have an iPhone 15, so I cannot determine its ISO range on an actual device.
I searched the web, but found no information about the ISO range of the iPhone 15 series.
For that reason, I would like to know the ISO range of the iPhone 15.
If anyone knows it, please let me know.
Thanks for your help in advance.
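In case it helps others, this is the sketch I now use to read the supported ISO range at runtime instead of hard-coding per-model values (assuming an AVCaptureDevice has already been obtained):

```swift
import AVFoundation

func logISORange(of device: AVCaptureDevice) {
    // Each format reports its own ISO limits; the active format's range
    // is what setExposureModeCustom(duration:iso:) will accept.
    let format = device.activeFormat
    print("minISO: \(format.minISO), maxISO: \(format.maxISO)")
}
```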
In an app using SwiftUI, I disabled a button using .disabled(). On iOS 14, when I tap the button it behaves as disabled, but the button is not dimmed.
It works fine on iOS 15 and 16.
Is this an iOS 14 or SwiftUI bug?
If not a bug, I would like to know how to dim the button on iOS 14.
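The workaround I am considering for iOS 14 is to dim the button manually alongside .disabled() (a sketch; isSaving is a hypothetical state flag standing in for my real condition):

```swift
import SwiftUI

struct ContentView: View {
    @State private var isSaving = false   // hypothetical condition

    var body: some View {
        Button("Save") { isSaving = true }
            .disabled(isSaving)
            // iOS 14 does not appear to dim a disabled button
            // automatically, so lower the opacity ourselves.
            .opacity(isSaving ? 0.4 : 1.0)
    }
}
```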
LAPolicy.deviceOwnerAuthenticationWithBiometrics always returns success when the app comes back from the background.
I would like to run biometric authentication when my iOS app returns from the background. I added this logic, but it always returns success.
Is this how iOS biometric authentication is specified to behave?
If not, please let me know how to fix it.
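One thing I have since tried (a sketch, not a confirmed fix): creating a fresh LAContext on every foreground transition, since my understanding is that a previously used context can report success without prompting again:

```swift
import LocalAuthentication

func authenticateOnForeground(completion: @escaping (Bool) -> Void) {
    // Create a new context each time; a reused LAContext can
    // remember an earlier evaluation and succeed immediately.
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock the app") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```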
I am having trouble with a special camera application and am posting to ask for your wisdom.
I am implementing 4K video capture with the AVFoundation framework. The app does not record video; instead, a film camera's shutter mechanism is attached in front of the iPhone camera, and still images are acquired by releasing the film camera's shutter.
For details, please visit the following website:
https://www.digi-swap.com/
Until the film camera's shutter is released, the iPhone camera receives a completely black image. When the shutter of the film camera is released at, for example, a shutter speed of 1/30 s, the light coming through the film camera's lens hits the iPhone's image sensor for 1/30 s, and this is acquired as a still image.
Naturally there is insufficient light, so the aperture of the iPhone camera is opened to its maximum and an exposure duration of 1 s or longer is used. Focus is also fixed.
When taking a picture under these conditions, the center of the still image is relatively bright, but the periphery is dark.
I suspect the cause is the short ramp-up time the iPhone camera has to go from complete black to receiving 1/30 s of light; in other words, the sensor may not have enough time to capture a clean image.
I am hoping to confirm whether my understanding is correct with the engineers who develop the camera stack at Apple, and hearing back from you would be highly appreciated.
Best regards,
Hello,
It is a very simple question: I can create two albums with the same name in the Photos app. Is this a bug or not?
For instance, create one album named "test01", then create another album with the same name. Two "test01" albums now exist.
If I write code that saves an image to the "test01" album, the same image is saved in both albums.
I am afraid this is a bug; I think an album with a duplicate name should not be allowed.
I would highly appreciate a response from Apple personnel.
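For my own code I now fetch an existing album by title before creating a new one, so duplicates are at least not created programmatically (a PhotoKit sketch; whether the Photos app itself should forbid duplicates is still my question):

```swift
import Photos

func fetchOrCreateAlbum(named title: String,
                        completion: @escaping (PHAssetCollection?) -> Void) {
    // Look for an existing album with this title first.
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "title = %@", title)
    let existing = PHAssetCollection.fetchAssetCollections(
        with: .album, subtype: .albumRegular, options: options)
    if let album = existing.firstObject {
        completion(album)
        return
    }
    // Otherwise create it, then fetch it back by its identifier.
    var placeholderID: String?
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCollectionChangeRequest
            .creationRequestForAssetCollection(withTitle: title)
        placeholderID = request.placeholderForCreatedAssetCollection.localIdentifier
    }) { success, _ in
        guard success, let id = placeholderID else { completion(nil); return }
        let fetched = PHAssetCollection.fetchAssetCollections(
            withLocalIdentifiers: [id], options: nil)
        completion(fetched.firstObject)
    }
}
```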
Best regards,
In the process of acquiring images asynchronously from a website and displaying them in an imageView inside a tableView cell, I reset the imageView's height constraint to match the aspect ratio of the downloaded image.
With this approach, if the image arrives after the cell has been created, the height does not change even though I reset the imageView's height constraint.
So I ran tableView.reloadRows(at: [indexPath], with: .none) after retrieving the image, and it displayed correctly.
Is this the correct solution?
Is it correct to assume that if I change an Auto Layout constraint on an object inside a cell after the cell is created, the change will not be reflected on screen unless I reload the cell?
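An alternative I have seen suggested (a sketch; ImageCell and its outlets are hypothetical names standing in for my real cell class) is to update the constraint and then ask the table view to recompute row heights without reloading the cell:

```swift
import UIKit

// Hypothetical cell with an outlet to the imageView's height constraint.
final class ImageCell: UITableViewCell {
    @IBOutlet var pictureView: UIImageView!
    @IBOutlet var pictureHeightConstraint: NSLayoutConstraint!
}

func apply(image: UIImage, to cell: ImageCell, in tableView: UITableView) {
    cell.pictureView.image = image
    // Update the constraint to preserve the image's aspect ratio.
    let aspect = image.size.height / image.size.width
    cell.pictureHeightConstraint.constant = cell.pictureView.bounds.width * aspect
    // Recompute row heights without reloading the cell's content.
    tableView.beginUpdates()
    tableView.endUpdates()
}
```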
I am developing a camera app and need to get real-time greyscale video from the camera.
I am using the AVFoundation framework and set the video output like this:
private func setupCamera() {
    ...
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
    ...
}
In captureOutput, I write this:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)!
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    let bitsPerComponent = 8
    let colorSpace = CGColorSpaceCreateDeviceGray()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)
    let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: bitsPerComponent, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)!
    let imageRef = context.makeImage()!
    let image = UIImage(cgImage: imageRef, scale: 1.0, orientation: .up)
    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    captureImageView.image = image
}
The key lines are:
let colorSpace = CGColorSpaceCreateDeviceGray()
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)
The image I get looks something like greyscale video, but it is not a clean, complete image.
How can I get a proper greyscale image?
Please advise if anyone knows how to get real-time greyscale video.
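A different approach I am experimenting with (a sketch): request a biplanar YCbCr pixel format from the output and build the greyscale image from the luma (Y) plane, instead of reinterpreting 32BGRA bytes as grey (where the grey context's row stride no longer matches the data):

```swift
import AVFoundation
import UIKit

// In setupCamera(), request a format whose first plane is 8-bit luminance:
// videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
//     Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]

func greyscaleImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    // Plane 0 of a biplanar YCbCr buffer is the luma plane.
    guard let base = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) else { return nil }
    let width = CVPixelBufferGetWidthOfPlane(buffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(buffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)

    guard let context = CGContext(data: base, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue),
          let cgImage = context.makeImage() else { return nil }
    return UIImage(cgImage: cgImage)
}
```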
I developed a camera app with the AVFoundation framework in which a long frame duration is required. When I apply the following settings,
camera.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 2)
camera.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: 2)
it takes several seconds for the camera preview to appear.
Recently I learned that the iPhone uses a rolling shutter, and I guess this may cause the delay.
Is my guess correct?
In addition, if I set timescale: 1, then captureOutput(_:didOutput:from:) is not called at all.
Why?
I am using the latest model, the iPhone 13 Pro, which supports a frame rate range of 1–60 fps.
Please advise if anyone knows further details.
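To rule out requesting a duration the current format cannot deliver, I now clamp the request against the active format's advertised ranges first (a sketch; this does not explain the preview delay, only guards the timescale: 1 case):

```swift
import AVFoundation

func setFrameDuration(_ duration: CMTime, on camera: AVCaptureDevice) throws {
    try camera.lockForConfiguration()
    defer { camera.unlockForConfiguration() }
    // Each format advertises the frame durations it supports; asking for a
    // value outside these ranges can leave the delegate without callbacks.
    for range in camera.activeFormat.videoSupportedFrameRateRanges
    where duration >= range.minFrameDuration && duration <= range.maxFrameDuration {
        camera.activeVideoMinFrameDuration = duration
        camera.activeVideoMaxFrameDuration = duration
        return
    }
    print("Duration \(duration.seconds)s is outside the supported ranges")
}
```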
I installed Xcode 13.2.1 on my MacBook Air (M1), then updated my iPhone 13 Pro to iOS 15.2.
I opened an Xcode project that was working correctly with the previous versions and ran it on the iPhone 13 Pro with the latest iOS 15.2, but Xcode does not recognize iOS 15.2.
The message "Failed to prepare device for development." is displayed.
It says, "This operation can fail if the version of the OS on the device is incompatible with the installed version of Xcode. You may also need to restart your mac and device in order to correctly detect compatibility."
Why?
I ran into a strange problem.
In a camera app using AVFoundation, I use the following code:
captureDevice = AVCaptureDevice.default(.builtInUltraWideCamera, for: .video, position: .back)
then:
let isAutoFocusSupported = captureDevice.isFocusModeSupported(.autoFocus)
"isAutoFocusSupported" should be true.
On the iPhone 13 Pro, it is true.
But on the 13 / 13 mini, it is false.
Why?
My app is a camera app using the AVFoundation framework. When it uses the ultra-wide camera, its initial zoom magnification is 0.5x. I would like to make it 1.0x.
How can I do this?
Please let me know the Swift code to achieve it.
Thanks in advance.
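My current idea (a sketch, and the factor of 2.0 is my assumption, not something I have verified on every model): on the ultra-wide camera, videoZoomFactor 1.0 is its native field of view (shown as 0.5x in the Camera app), so doubling the factor should crop to roughly the wide camera's 1.0x framing:

```swift
import AVFoundation

func resetToOneX(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Assumed: 2.0x on the ultra-wide camera approximates the
        // wide camera's 1.0x field of view. Clamp to the format's limit.
        device.videoZoomFactor = min(2.0, device.activeFormat.videoMaxZoomFactor)
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```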